Search Results

202,002 results for "Value analysis"
An integrated 1D–2D hydraulic modelling approach to assess the sensitivity of a coastal region to compound flooding hazard under climate change
Coastal regions are dynamic areas that often lie at the junction of different natural hazards. Extreme events such as storm surges and high precipitation are significant sources of concern for flood management. As climatic changes and sea-level rise put further pressure on these vulnerable systems, there is a need for a better understanding of the implications of compounding hazards. Recent computational advances in hydraulic modelling offer new opportunities to support decision-making and adaptation. Our research makes use of recently released features in the HEC-RAS version 5.0 software to develop an integrated 1D–2D hydrodynamic model. Using extreme value analysis with the Peaks-Over-Threshold method to define extreme scenarios, the model was applied to the eastern coast of the UK. The sensitivity of the protected wetland known as the Broads to a combination of fluvial, tidal and coastal sources of flooding was assessed, accounting for different rates of twenty-first century sea-level rise up to the year 2100. The 1D–2D approach led to a more detailed representation of inundation in coastal urban areas, while allowing for interactions with more fluvially dominated inland areas to be captured. While flooding was primarily driven by increased sea levels, combined events exacerbated flooded area by 5–40% and average depth by 10–32%, affecting different locations depending on the scenario. The results emphasise the importance of catchment-scale strategies that account for potentially interacting sources of flooding.
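The Peaks-Over-Threshold step mentioned above can be illustrated with a minimal sketch, assuming a synthetic daily water-level series and a fixed 98th-percentile threshold; the data, record length, and threshold choice are illustrative assumptions rather than values from the study, and declustering of consecutive exceedances is omitted.

    import numpy as np
    from scipy import stats

    # Hypothetical daily series (e.g., surge levels in metres) standing in for gauge records.
    rng = np.random.default_rng(42)
    series = rng.gumbel(loc=1.0, scale=0.3, size=40 * 365)  # roughly 40 years of daily values

    # Peaks-Over-Threshold: keep exceedances above a high empirical quantile.
    threshold = np.quantile(series, 0.98)
    excesses = series[series > threshold] - threshold

    # Fit a Generalized Pareto Distribution to the excesses (location fixed at zero).
    shape, loc, scale = stats.genpareto.fit(excesses, floc=0)

    # m-year return level: the level exceeded on average once every m years.
    exceedances_per_year = len(excesses) / 40
    def return_level(m_years):
        p = 1.0 - 1.0 / (m_years * exceedances_per_year)
        return threshold + stats.genpareto.ppf(p, shape, loc=0, scale=scale)

    print(f"threshold: {threshold:.2f} m, 100-year level: {return_level(100):.2f} m")

Return levels estimated this way define the extreme boundary scenarios that a hydraulic model such as the one described above would then be driven with.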
A probabilistic gridded product for daily precipitation extremes over the United States
Gridded data products, for example interpolated daily measurements of precipitation from weather stations, are commonly used as a convenient substitute for direct observations because these products provide a spatially and temporally continuous and complete source of data. However, when the goal is to characterize climatological features of extreme precipitation over a spatial domain (e.g., a map of return values) at the native spatial scales of these phenomena, then gridded products may lead to incorrect conclusions because daily precipitation is a fractal field and hence any smoothing technique will dampen local extremes. To address this issue, we create a new “probabilistic” gridded product specifically designed to characterize the climatological properties of extreme precipitation by applying spatial statistical analysis to daily measurements of precipitation from the Global Historical Climatology Network over the contiguous United States. The essence of our method is to first estimate the climatology of extreme precipitation based on station data and then use a data-driven statistical approach to interpolate these estimates to a fine grid. We argue that our method yields an improved characterization of the climatology within a grid cell because the probabilistic behavior of extreme precipitation is much better behaved (i.e., smoother) than daily weather. Furthermore, the spatial smoothing innate to our approach significantly increases the signal-to-noise ratio in the estimated extreme statistics relative to an analysis without smoothing. Finally, by deriving a data-driven approach for translating extreme statistics to a spatially complete grid, the methodology outlined in this paper resolves the issue of how to properly compare station data with output from earth system models. We conclude the paper by comparing our probabilistic gridded product with a standard extreme value analysis of the Livneh gridded daily precipitation product. Our new data product is freely available on the Harvard Dataverse ( https://bit.ly/2CXdnuj ).
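A minimal sketch of the estimate-then-interpolate idea described above, using hypothetical station coordinates and synthetic annual maxima; the paper applies a spatial statistical model rather than the plain linear interpolation used here, so this only illustrates why the extreme statistics, not the daily fields, are the quantities being gridded.

    import numpy as np
    from scipy import stats
    from scipy.interpolate import griddata

    # Hypothetical station network over the contiguous US with 50 years of annual maxima (mm/day).
    rng = np.random.default_rng(0)
    n_stations = 200
    lon = rng.uniform(-124, -67, n_stations)
    lat = rng.uniform(25, 49, n_stations)
    annual_maxima = rng.gumbel(loc=40, scale=12, size=(n_stations, 50))

    # Step 1: estimate the extreme-value climatology at each station
    # (a GEV fit and its 20-year return value).
    return_values = np.empty(n_stations)
    for i in range(n_stations):
        c, loc, scale = stats.genextreme.fit(annual_maxima[i])
        return_values[i] = stats.genextreme.ppf(1 - 1 / 20, c, loc=loc, scale=scale)

    # Step 2: interpolate the statistics, which vary far more smoothly in space
    # than daily precipitation itself, onto a fine grid.
    grid_lon, grid_lat = np.meshgrid(np.linspace(-124, -67, 230), np.linspace(25, 49, 100))
    gridded_rv = griddata((lon, lat), return_values, (grid_lon, grid_lat), method="linear")

Because the interpolation acts on fitted return values rather than on daily weather, the local extremes are not smoothed away the way they are in a conventional gridded daily product.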
INLA goes extreme: Bayesian tail regression for the estimation of high spatio-temporal quantiles
This work is motivated by the challenge organized for the 10th International Conference on Extreme-Value Analysis (EVA2017) to predict daily precipitation quantiles at the 99.8 % level for each month at observed and unobserved locations. Our approach is based on a Bayesian generalized additive modeling framework that is designed to estimate complex trends in marginal extremes over space and time. First, we estimate a high non-stationary threshold using a gamma distribution for precipitation intensities that incorporates spatial and temporal random effects. Then, we use the Bernoulli and generalized Pareto (GP) distributions to model the rate and size of threshold exceedances, respectively, which we also assume to vary in space and time. The latent random effects are modeled additively using Gaussian process priors, which provide high flexibility and interpretability. We develop a penalized complexity (PC) prior specification for the tail index that shrinks the GP model towards the exponential distribution, thus preventing unrealistically heavy tails. Fast and accurate estimation of the posterior distributions is performed thanks to the integrated nested Laplace approximation (INLA). We illustrate this methodology by modeling the daily precipitation data provided by the EVA2017 challenge, which consist of observations from 40 stations in the Netherlands recorded during the period 1972–2016. Capitalizing on INLA’s fast computational capacity and powerful distributed computing resources, we conduct an extensive cross-validation study to select the model parameters that govern the smoothness of trends. Our results clearly outperform simple benchmarks and are comparable to the best-scoring approaches of the other teams.
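For reference, the Bernoulli-generalized Pareto decomposition described above implies the standard closed form for a high quantile such as the 99.8% level; this is the textbook relationship (with the spatial and temporal indexing of the parameters suppressed), not a formula quoted from the paper. With threshold u, exceedance probability \zeta_u (the Bernoulli rate), and GP scale \sigma and tail index \xi:

    q_p = u + \frac{\sigma}{\xi}\left[\left(\frac{1 - p}{\zeta_u}\right)^{-\xi} - 1\right], \qquad 1 - p < \zeta_u

So for p = 0.998 the quantile follows by plugging the three fitted components (threshold, exceedance rate, and GP parameters) into a single expression; as \xi \to 0 the bracketed term tends to the exponential limit -\log\!\big((1-p)/\zeta_u\big), which is the behaviour the PC prior shrinks towards.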
Philosophy of digital currencies: a theory of monetising time
\"Current systems are failing the poor because these systems are unable to provide the financial inclusion needed for basic subsistence and commerce, which in turn would drive micro and macro-economic growth. This book introduces the reader to a new way of thinking about how value can be created, captured, measured and understood, economically and financially, and within in the context of social contract. It underscores the need to revisit such models through technological advancements, namely Industrial Revolution 4.0, in order to solve pressing global issues like economic inclusion and poverty eradication. The book proposes that for humanity to make the leap forward and for any real sustainable development to occur, the world needs a disruptive approach to value creation using currency systems, considering that currencies underpin value exchange. This disruption will result in a level of decentralization that facilitates peer-to-peer value exchange and drives financial inclusion, all of which should be underscored by a new, digital social contract. The author asserts that a time-based digital currency could address these issues by creating a new and truly inclusive currency model that allows economies to gain more value than previously possible. In addition, by leveraging 4IR technologies, a currency system can be designed where each unit of money accurately reflects the context and range of socio-economic factors that influence each human interaction. This book is aimed at futurists, technologists, researchers, policymakers, and anyone that is curious about how technology could make a difference in our collective futures. It cuts across a range of subject areas from economics, finance, philosophy, innovation to social development and takes an interdisciplinary approach to present a logical framework and theoretical foundation for the monetization of time as a digital currency\"-- Provided by publisher.
Use of multi‐perturbation Shapley analysis in lesion studies of functional networks: The case of upper limb paresis
Understanding the impact of variation in lesion topography on the expression of functional impairments following stroke is important, as it may pave the way to modeling structure–function relations in statistical terms while pointing to constraints for adaptive remapping and functional recovery. Multi‐perturbation Shapley‐value analysis (MSA) is a relatively novel game‐theoretical approach for multivariate lesion‐symptom mapping. In this methodological paper, we provide a comprehensive explanation of MSA. We use synthetic data to assess the method's accuracy and perform parameter optimization. We then demonstrate its application using a cohort of 107 first‐event subacute stroke patients, assessed for upper limb (UL) motor impairment (Fugl‐Meyer Assessment scale). Under the conditions tested, MSA could correctly detect simulated ground‐truth lesion‐symptom relationships with a sensitivity of 75% and specificity of ~90%. For real behavioral data, MSA disclosed a strong hemispheric effect in the relative contribution of specific regions‐of‐interest (ROIs): poststroke UL motor function was mostly contributed by damage to ROIs associated with movement planning (supplementary motor cortex and superior frontal gyrus) following left‐hemispheric damage (LHD) and by ROIs associated with movement execution (primary motor and somatosensory cortices and the ventral brainstem) following right‐hemispheric damage (RHD). Residual UL motor ability following LHD was found to depend on a wider array of brain structures compared to the residual motor ability of RHD patients. The results demonstrate that MSA can provide a unique insight into the relative importance of different hubs in neural networks, which is difficult to obtain using standard univariate methods. In this methodological paper, we described in detail a newly revised multivariate approach to lesion‐symptom mapping based on the game‐theoretical principles of multi‐perturbation Shapley‐value analysis. Using a data set of 107 stroke patients as a test case, we showed the ability of this revised approach to correctly detect ground‐truth brain–behavior relationships using moderately sized cohorts (50–60 patients), with acceptable type‐I and type‐II error rates. The results obtained by this method can provide useful information regarding the underlying brain network supporting the behavior of interest.
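The Shapley attribution at the core of MSA can be illustrated with a minimal sketch; the ROI names and the characteristic (performance) function below are hypothetical stand-ins, since in the real analysis that function is estimated from observed lesion (perturbation) configurations and behavioral scores, and sampling approximations replace exact enumeration when many regions are involved.

    from itertools import permutations

    # Hypothetical "players": regions of interest whose joint state determines performance.
    rois = ["SMA", "M1", "S1", "SFG"]

    # Hypothetical characteristic function: predicted motor score when only the ROIs
    # in `intact` are functioning (all others perturbed/lesioned).
    def performance(intact):
        score = 0.0
        if "M1" in intact:
            score += 0.5
        if "S1" in intact:
            score += 0.2
        if "SMA" in intact and "SFG" in intact:
            score += 0.3  # planning-related regions contribute only jointly
        return score

    # Exact Shapley values: average each ROI's marginal contribution over all orderings.
    def shapley_values(players, value):
        contrib = {p: 0.0 for p in players}
        orderings = list(permutations(players))
        for order in orderings:
            intact = set()
            for p in order:
                before = value(intact)
                intact.add(p)
                contrib[p] += value(intact) - before
        return {p: c / len(orderings) for p, c in contrib.items()}

    print(shapley_values(rois, performance))  # contributions sum to performance(all ROIs)

The attraction of this decomposition is that each region's contribution is averaged over every possible combination of co-occurring damage, which is exactly the multivariate view that standard univariate lesion-symptom mapping lacks.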
C-O-S-T
Why do companies exert high effort to reduce the costs of products that are in production? Because they can! Because unnecessary product costs were not removed during product development. C-O-S-T, short for Cost Optimization System and Technique, details how a company's product development teams, their supporting functions, and company leaders can optimize product costs before production starts and thereby maximize lifecycle profits. Since product development teams determine the product costs imparted to new products, much of the book details how these teams optimize product costs. The book also includes ways company leaders can create and sustain company-wide engagement in optimizing product costs and keeping the resulting increased profit margins. The reader is entertained while observing a three-day workshop where executives of a fictitious company, Defender Products, Inc., are being taught the C-O-S-T system by its developers. The story flows like a business workshop with slides, dialog, and break-out sessions. The content will benefit all companies that design, develop and manufacture products.